252 research outputs found

    Emotion research by the people, for the people

    Emotion research will leap forward when its focus shifts from comparing averaged statistics of self-report data across people experiencing emotion in laboratories to characterizing patterns in data from individuals, and clusters of similar individuals, experiencing emotion in real life. Such an advance will come about through engineers and psychologists collaborating to create new ways for people to measure, share, analyze, and learn from objective emotional responses in situations that truly matter to them. This approach has the power to greatly advance the science of emotion while also providing personalized help to the participants in the research.

    Estimating Carotid Pulse and Breathing Rate from Near-infrared Video of the Neck

    Objective: Non-contact physiological measurement is a growing research area that allows vital signs such as heart rate (HR) and breathing rate (BR) to be captured comfortably and unobtrusively with remote devices. However, most approaches work only in bright environments, in which the subtle photoplethysmographic and ballistocardiographic signals can be easily analyzed, and/or require expensive custom hardware to perform the measurements. Approach: This work introduces a low-cost method to measure the subtle motions associated with the carotid pulse and breathing movement of the neck using near-infrared (NIR) video imaging. A skin reflection model of the neck was established to provide a theoretical foundation for the method. The method relies on template matching for neck detection, Principal Component Analysis for feature extraction, and Hidden Markov Models for data smoothing. Main Results: We compared the estimated HR and BR with measures provided by an FDA-cleared device in a 12-participant laboratory study: the estimates achieved a mean absolute error of 0.36 beats per minute and 0.24 breaths per minute under both bright and dark lighting. Significance: This work advances the possibilities of non-contact physiological measurement in real-life conditions in which environmental illumination is limited and in which the face of the person is not readily available or needs to be protected. Given the increasing availability of NIR imaging devices, the described methods are readily scalable.
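    A rough sketch of the kind of pipeline the abstract describes, assuming a list of grayscale NIR frames and a grayscale neck template; the template-matching and PCA steps follow the abstract, while the band-pass filter plus FFT peak used here stands in for the paper's Hidden-Markov-Model smoothing, and all function and variable names are ours:

    import numpy as np
    import cv2
    from scipy.signal import butter, filtfilt
    from sklearn.decomposition import PCA

    def locate_neck(frame, neck_template):
        """Find the neck ROI in a frame via normalized template matching."""
        scores = cv2.matchTemplate(frame, neck_template, cv2.TM_CCOEFF_NORMED)
        _, _, _, (x, y) = cv2.minMaxLoc(scores)
        h, w = neck_template.shape
        return x, y, w, h

    def estimate_hr_bpm(frames, neck_template, fps=30.0):
        """Estimate heart rate (bpm) from per-frame neck-ROI intensity traces."""
        x, y, w, h = locate_neck(frames[0], neck_template)
        # One row per frame: downsampled ROI pixel intensities.
        traces = np.array([f[y:y + h:4, x:x + w:4].astype(float).ravel()
                           for f in frames])
        # The first principal component captures the dominant shared variation,
        # which includes the subtle pulse-related motion/intensity changes.
        signal = PCA(n_components=1).fit_transform(traces)[:, 0]
        # Keep only the plausible heart-rate band (0.7-3.0 Hz, i.e. 42-180 bpm).
        b, a = butter(3, [0.7, 3.0], btype="band", fs=fps)
        pulse = filtfilt(b, a, signal)
        # Dominant spectral peak converted to beats per minute.
        spectrum = np.abs(np.fft.rfft(pulse * np.hanning(len(pulse))))
        freqs = np.fft.rfftfreq(len(pulse), d=1.0 / fps)
        return 60.0 * freqs[np.argmax(spectrum)]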

    Understanding Ambulatory and Wearable Data for Health and Wellness

    In our research, we aim (1) to recognize human internal states and behaviors (e.g., stress level, mood, and sleep behaviors), (2) to reveal which features in which data can work as predictors, and (3) to use them for intervention. We collect multi-modal (physiological, behavioral, environmental, and social) ambulatory data using wearable sensors and mobile phones, combining them with standardized questionnaires and data measured in the laboratory. In this paper, we introduce our approach and some of our projects.
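    As an illustration of aim (2), a minimal sketch on synthetic placeholder data (not the project's dataset or feature set) that fits a classifier to multi-modal features and inspects which of them act as predictors of a state such as stress level:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    feature_names = ["skin_conductance", "heart_rate", "step_count",
                     "screen_time", "ambient_light", "calls_per_day"]
    # Placeholder ambulatory features and a synthetic binary stress label.
    X = rng.normal(size=(200, len(feature_names)))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
    for name, importance in sorted(zip(feature_names, model.feature_importances_),
                                   key=lambda pair: -pair[1]):
        print(f"{name}: importance = {importance:.3f}")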

    Recognition of Sleep Dependent Memory Consolidation with Multi-modal Sensor Data

    This paper presents the possibility of recognizing sleep-dependent memory consolidation using multi-modal sensor data. We collected visual discrimination task (VDT) performance before and after sleep in laboratory, hospital, and home settings for N=24 participants while recording EEG (electroencephalogram), EDA (electrodermal activity), and ACC (accelerometer) or actigraphy data during sleep. We extracted features from the sleep data and applied machine learning techniques (discriminant analysis, support vector machine, and k-nearest neighbor) to classify whether the participants showed improvement in the memory task. Our results showed 60–70% accuracy in a binary classification of task performance using EDA or EDA+ACC features, an improvement over the more traditional use of sleep stages (the percentages of slow wave sleep (SWS) in the first quarter and rapid eye movement (REM) sleep in the fourth quarter of the night) to predict VDT improvement.
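    A minimal sketch of this kind of binary classification with the three classifier families named above, evaluated with leave-one-out cross-validation as is natural for N=24; the feature matrix here is a random placeholder, not the study's EDA/ACC features:

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.svm import SVC
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(24, 10))    # placeholder: 24 nights x 10 EDA/ACC features
    y = rng.integers(0, 2, size=24)  # placeholder: improved vs. not improved

    for name, clf in [("discriminant analysis", LinearDiscriminantAnalysis()),
                      ("SVM", SVC(kernel="linear")),
                      ("k-NN", KNeighborsClassifier(n_neighbors=3))]:
        pipe = make_pipeline(StandardScaler(), clf)
        accuracy = cross_val_score(pipe, X, y, cv=LeaveOneOut()).mean()
        print(f"{name}: leave-one-out accuracy = {accuracy:.2f}")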

    SenseGlass: using google glass to sense daily emotions

    For over a century, scientists have studied human emotions in laboratory settings. However, these emotions have been largely contrived, elicited by movies or artificial "lab" stimuli that tend not to matter to the participants in the studies, at least not compared with events in their real lives. This work explores the utility of Google Glass, a head-mounted wearable device, to enable fundamental advances in the creation of affect-based user interfaces in natural settings.

    Modeling Subjective Experience-Based Learning under Uncertainty and Frames

    In this paper we computationally examine how subjective experience may help or harm a decision maker's learning under uncertain outcomes, frames, and their interactions. To model subjective experience, we propose an "experienced-utility function" built on a parameterized prospect theory (PT) subjective value function. Our analysis and simulations of two-armed bandit tasks show that the task domain (the underlying outcome distributions) and the framing (the choice of reference point) influence experienced utilities and, in turn, the "subjective discriminability" of choices under uncertainty. Experiments demonstrate that subjective discriminability improves on objective discriminability when the experienced-utility function is used with appropriate framing for a given task domain, and that greater subjective discriminability leads to more optimal decisions in learning under uncertainty.
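    A rough sketch of the idea, assuming a standard piecewise prospect-theory value function and a simple epsilon-greedy two-armed bandit learner; the parameter values, outcome distributions, and learning rule are illustrative assumptions, not the paper's model:

    import numpy as np

    def pt_value(outcome, ref=0.0, alpha=0.88, beta=0.88, lam=2.25):
        """Prospect-theory-style subjective value relative to a reference point."""
        d = outcome - ref
        return d ** alpha if d >= 0 else -lam * (-d) ** beta

    def run_bandit(means=(1.0, 1.2), sd=1.0, ref=0.0, trials=1000,
                   lr=0.1, eps=0.1, seed=0):
        """Epsilon-greedy learner that updates from experienced (subjective) utility."""
        rng = np.random.default_rng(seed)
        q = np.zeros(2)  # learned subjective value estimate per arm
        for _ in range(trials):
            arm = int(rng.integers(2)) if rng.random() < eps else int(np.argmax(q))
            outcome = rng.normal(means[arm], sd)
            # The update uses the experienced utility, not the raw outcome,
            # so the reference point (frame) shapes what is learned.
            q[arm] += lr * (pt_value(outcome, ref) - q[arm])
        return q

    print(run_bandit(ref=0.0))  # gains frame: outcomes mostly above the reference
    print(run_bandit(ref=2.0))  # losses frame: reference above both arm means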

    Acted vs. natural frustration and delight: many people smile in natural frustration

    This work is part of research to build a system that combines facial and prosodic information to recognize commonly occurring user states such as delight and frustration. We create two experimental situations to elicit these two emotional states: the first involves recalling situations while expressing either delight or frustration; the second elicits the states directly through a frustrating experience and through a delightful video. We find two significant differences between the acted and natural occurrences of the expressions. First, the acted ones are much easier for the computer to recognize. Second, in 90% of the acted cases participants did not smile when frustrated, whereas in 90% of the natural cases participants smiled during the frustrating interaction, despite self-reporting significant frustration with the experience. This paper begins to explore the differences in the patterns of smiling seen under natural frustration and delight conditions, to see whether there might be something measurably different about the smiles in these two cases, which could ultimately improve the performance of classifiers applied to natural expressions.

    Crowd-powered positive psychological interventions

    Recent advances in crowdsourcing have led to new forms of assistive technologies, commonly referred to as crowd-powered devices. To best serve the user, these technologies crowdsource human intelligence as needed, when automated methods alone are insufficient. In this paper, we provide an overview of how these systems work and how they can be used to enhance technological interventions for positive psychology. As a specific example, we describe previous work that crowdsources positive reappraisals, providing users with timely and personalized suggestions for ways to reconstrue stressful thoughts and situations. We then describe how this approach could be extended to other positive psychological interventions. Finally, we outline future directions for crowd-powered positive psychological interventions.
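    The "crowdsource human intelligence as needed" pattern can be pictured as a confidence-gated fallback; the sketch below only illustrates that routing idea, and every name in it is a hypothetical placeholder rather than an API from the cited work:

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Suggestion:
        text: str
        confidence: float  # the automated responder's self-rated confidence, 0-1

    def get_reappraisal(stressful_thought: str,
                        automated: Callable[[str], Suggestion],
                        ask_crowd: Callable[[str], Suggestion],
                        threshold: float = 0.8) -> Suggestion:
        """Use the automated suggestion when it is confident enough, else route to the crowd."""
        candidate = automated(stressful_thought)
        if candidate.confidence >= threshold:
            return candidate
        # Automated methods alone are insufficient here: fall back to human workers.
        return ask_crowd(stressful_thought)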